Records retrieved: 13
5. OSCO, L. P.; ARRUDA, M. S.; GONÇALVES, D. N.; DIAS, A.; BATISTOTI, J.; SOUZA, M.; GOMES, F. D. G.; RAMOS, A. P. M.; JORGE, L. A. de C.; LIESENBERG, V.; LI, J.; MA, L.; MARCATO JUNIOR, J.; GONÇALVES, W. N. A CNN approach to simultaneously count plants and detect plantation-rows from UAV imagery. ISPRS Journal of Photogrammetry and Remote Sensing, v. 174, p. 1-17, 2021. Library(ies): Embrapa Instrumentação.

6. OSCO, L. P.; MARCATO JUNIOR, J.; RAMOS, A. P. M.; JORGE, L. A. de C.; FATHOLAHI, S. N.; SILVA, J. A.; MATSUBARA, E. T.; PISTORI, H.; GONÇALVES, W. N.; LI, J. A review on deep learning in UAV remote sensing. International Journal of Applied Earth Observation and Geoinformation, v. 102, 102456, p. 1-22, 2021. Library(ies): Embrapa Instrumentação.

7. OSCO, L. P.; NOGUEIRA, K.; RAMOS, A. P. M.; PINHEIRO, M. M. F.; FURUYA, D. E. G.; GONÇALVES, W. N.; JORGE, L. A. de C.; MARCATO JUNIOR, J.; SANTOS, J. A. Semantic segmentation of citrus-orchard using deep neural networks and multispectral UAV-based imagery. Precision Agriculture, v. 22, n. 4, p. 1171-1188, 2021. Library(ies): Embrapa Instrumentação.

8. RAMOS, A. P. M.; GOMES, F. D. G.; PINHEIRO, M. M. F.; FURUYA, D. E. G.; GONÇALVES, W. N.; MARCATO JUNIOR, J.; MICHEREFF, M. F. F.; MORAES, M. C. B.; BORGES, M.; LAUMANN, R. A.; LIESENBERG, V.; JORGE, L. A. de C.; OSCO, L. P. Detecting the attack of the fall armyworm (Spodoptera frugiperda) in cotton plants with machine learning and spectral measurements. Precision Agriculture, 2021. In the publication: Maria Carolina Blassioli-Moraes; Raúl Alberto Alaumann. Library(ies): Embrapa Instrumentação; Embrapa Recursos Genéticos e Biotecnologia.

9. OSCO, L. P.; FURUYA, D. E. G.; FURUYA, M. T. G.; CORRÊA, D. V.; GONÇALVES, W. N.; MARCATO JUNIOR, J.; BORGES, M.; BLASSIOLI-MORAES, M. C.; MICHEREFF, M. F. F.; AQUINO, M. F. S.; LAUMANN, R. A.; LIESENBERG, V.; RAMOS, A. P. M.; JORGE, L. A. de C. An impact analysis of pre-processing techniques in spectroscopy data to classify insect-damaged in soybean plants with machine and deep learning methods. Infrared Physics & Technology, v. 123, 104203, 2022. 13 p. Library(ies): Embrapa Instrumentação.

10. OSCO, L. P.; FURUYA, D. E. G.; FURUYA, M. T. G.; CORRÊA, D. V.; GONÇALVES, W. N.; MARCATO JUNIOR, J.; BORGES, M.; MORAES, M. C. B.; MICHEREFF, M. F. F.; AQUINO, M. F. S.; LAUMANN, R. A.; LIESENBERG, V.; RAMOS, A. P. M.; JORGE, L. A. de C. An impact analysis of pre-processing techniques in spectroscopy data to classify insect-damaged in soybean plants with machine and deep learning methods. Infrared Physics & Technology, v. 123, 104203, 2022. In the publication: Maria Carolina Blassioli-Moraes. Library(ies): Embrapa Recursos Genéticos e Biotecnologia.

11. OSCO, L. P.; RAMOS, A. P. M.; PINHEIRO, M. M. F.; MORIYA, E. A. S.; IMAI, N. N.; ESTRABIS, N.; IANCZYK, F.; ARAÚJO, F. F.; LIESENBERG, V.; JORGE, L. A. de C.; LI, J.; MA, L.; GONÇALVES, W. N.; MARCATO JUNIOR, J.; CRESTE, J. E. A machine learning framework to predict nutrient content in valencia-orange leaf hyperspectral measurements. Remote Sensing, v. 12, n. 6, a. 906, p. 1-21, 2020. Library(ies): Embrapa Instrumentação.

12. FURUYA, D. E. G.; MA, L.; PINHEIRO, M. M. F.; GOMES, F. D. G.; GONÇALVES, W. N.; MARCATO JUNIOR, J.; RODRIGUES, D. de C.; BLASSIOLI-MORAES, M. C.; MICHEREFF, M. F. F.; BORGES, M.; LAUMANN, R. A.; FERREIRA, E. J.; OSCO, L. P.; RAMOS, A. P. M.; LI, J.; JORGE, L. A. de C. Prediction of insect-herbivory-damage and insect-type attack in maize plants using hyperspectral data. International Journal of Applied Earth Observation and Geoinformation, v. 105, 102608, p. 1-10, 2021. Library(ies): Embrapa Instrumentação.

13. FURUYA, D. E. G.; MA, L.; PINHEIRO, M. M. F.; GOMES, F. D. G.; GONÇALVES, W. N.; MARCATO JUNIOR, J.; RODRIGUES, D. de C.; BLASSIOLI-MORAES, M. C.; MICHEREFF, M. F. F.; BORGES, M.; LAUMANN, R. A.; FERREIRA, E. J.; OSCO, L. P.; RAMOS, A. P. M.; LI, J.; JORGE, L. A. de C. Prediction of insect-herbivory-damage and insect-type attack in maize plants using hyperspectral data. International Journal of Applied Earth Observation and Geoinformation, v. 105, 102608, p. 1-10, 2021. Library(ies): Embrapa Recursos Genéticos e Biotecnologia.
Access to the full text is restricted to the Embrapa Instrumentação library. For further information, contact cnpdia.biblioteca@embrapa.br.
Full Record
Library(ies):
Embrapa Instrumentação.
Date entered:
12/04/2021
Last updated:
16/08/2022
Type of scientific production:
Indexed Journal Article
Circulation/Level:
A - 1
Authorship:
OSCO, L. P.; NOGUEIRA, K.; RAMOS, A. P. M.; PINHEIRO, M. M. F.; FURUYA, D. E. G.; GONÇALVES, W. N.; JORGE, L. A. de C.; MARCATO JUNIOR, J.; SANTOS, J. A.
Affiliation:
LUCIO ANDRE DE CASTRO JORGE, CNPDIA.
Title:
Semantic segmentation of citrus-orchard using deep neural networks and multispectral UAV-based imagery.
Year of publication:
2021
Source/Imprint:
Precision Agriculture, v. 22, n. 4, 2021.
Pages:
1171-1188
DOI:
https://doi.org/10.1007/s11119-020-09777-5
Language:
English
Abstract:
Accurately mapping farmlands is important for precision agriculture practices. Unmanned aerial vehicles (UAV) embedded with multispectral cameras are commonly used to map plants in agricultural landscapes. However, separating plantation fields from the remaining objects in a multispectral scene is a difficult task for traditional algorithms. In this connection, deep learning methods that perform semantic segmentation could help improve the overall outcome. In this study, state-of-the-art deep learning methods to semantically segment citrus trees in multispectral images were evaluated. For this purpose, a multispectral camera that operates at the green (530–570 nm), red (640–680 nm), red-edge (730–740 nm) and near-infrared (770–810 nm) spectral regions was used. The performance of the following five state-of-the-art pixelwise methods was evaluated: fully convolutional network (FCN), U-Net, SegNet, dynamic dilated convolution network (DDCN) and DeepLabV3+. The results indicated that the evaluated methods performed similarly in the proposed task, returning F1-Scores between 94.00% (FCN and U-Net) and 94.42% (DDCN). The inference time needed per area was also determined and, although the DDCN method was slower, a qualitative analysis showed that it performed better in highly shadow-affected areas. This study demonstrated that the semantic segmentation of citrus orchards is highly achievable with deep neural networks. The state-of-the-art deep learning methods investigated here proved to be equally suitable to solve this task, providing fast solutions with inference time varying from 0.98 to 4.36 min per hectare. This approach could be incorporated into similar research, and contribute to decision-making and accurate mapping of plantation fields.
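The F1-Scores reported in the abstract compare a predicted segmentation mask against a ground-truth mask pixel by pixel. A minimal sketch of that metric on binary masks (illustrative only; this is not the authors' evaluation code, and the toy masks below are invented for the example):

```python
# Illustrative sketch: pixelwise F1-score between a predicted and a
# ground-truth binary segmentation mask (1 = citrus canopy, 0 = background).
def pixelwise_f1(pred, truth):
    tp = fp = fn = 0
    for pred_row, truth_row in zip(pred, truth):
        for p, t in zip(pred_row, truth_row):
            if p == 1 and t == 1:
                tp += 1  # canopy pixel correctly detected
            elif p == 1 and t == 0:
                fp += 1  # background wrongly predicted as canopy
            elif p == 0 and t == 1:
                fn += 1  # canopy pixel missed
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Toy 3x3 masks: one canopy pixel is missed, none is falsely detected,
# so precision = 1.0 and recall = 0.75.
truth = [[1, 1, 0],
         [1, 0, 0],
         [0, 0, 1]]
pred  = [[1, 1, 0],
         [0, 0, 0],
         [0, 0, 1]]
print(round(pixelwise_f1(pred, truth), 3))  # 0.857
```

The paper's reported scores (94.00%–94.42%) are this quantity averaged over the test imagery, which is why methods with very different architectures can land within half a percentage point of each other.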
Keywords:
Convolutional neural network; Thematic map.
Subject category:
--
Marc: |
LEADER 02591naa a2200265 a 4500 001 2131208 005 2022-08-16 008 2021 bl uuuu u00u1 u #d 024 7 $ahttps://doi.org/10.1007/s11119-020-09777-5$2DOI 100 1 $aOSCO, L. P. 245 $aSemantic segmentation of citrus-orchard using deep neural networks and multispectral UAV-based imagery.$h[electronic resource] 260 $c2021 300 $a1171-1188 520 $aAccurately mapping farmlands is important for precision agriculture practices. Unmanned aerial vehicles (UAV) embedded with multispectral cameras are commonly used to map plants in agricultural landscapes. However, separating plantation fields from the remaining objects in a multispectral scene is a difficult task for traditional algorithms. In this connection, deep learning methods that perform semantic segmentation could help improve the overall outcome. In this study, state-of-the-art deep learning methods to semantic segment citrus-trees in multispectral images were evaluated. For this purpose, a multispectral camera that operates at the green (530–570 nm), red (640–680 nm), red-edge (730–740 nm) and also near-infrared (770–810 nm) spectral regions was used. The performance of the following five state-of-the-art pixelwise methods were evaluated: fully convolutional network (FCN), U-Net, SegNet, dynamic dilated convolution network (DDCN) and DeepLabV3+. The results indicated that the evaluated methods performed similarly in the proposed task, returning F1-Scores between 94.00% (FCN and U-Net) and 94.42% (DDCN). It was also determined the inference time needed per area and, although the DDCN method was slower, based on a qualitative analysis, it performed better in highly shadow-affected areas. This study demonstrated that the semantic segmentation of citrus orchards is highly achievable with deep neural networks. The state-of-the-art deep learning methods investigated here proved to be equally suitable to solve this task, providing fast solutions with inference time varying from 0.98 to 4.36 min per hectare.
This approach could be incorporated into similar research, and contribute to decision-making and accurate mapping of plantation fields. 653 $aConvolutional neural network 653 $aThematic map 700 1 $aNOGUEIRA, K. 700 1 $aRAMOS, A. P. M. 700 1 $aPINHEIRO, M. M. F. 700 1 $aFURUYA, D. E. G. 700 1 $aGONÇALVES, W. N. 700 1 $aJORGE, L. A. de C. 700 1 $aMARCATO JUNIOR, J. 700 1 $aSANTOS, J. A. 773 $tPrecision Agriculture$gv. 22, n. 4, 2021.
Original record:
Embrapa Instrumentação (CNPDIA) |